History of the Computer Industry in America

Only once in a lifetime does a new invention come along that touches every aspect of our lives. A device that changes the way we work, live, and play is a special one indeed. A machine that has done all this and more now exists in nearly every business in the U.S. and in one out of every two households (Hall, 156). This incredible invention is the computer. The electronic computer has been around for over half a century, but its ancestors have been around for 2,000 years. Only in the last 40 years, however, has it changed American society. From the first wooden abacus to the latest high-speed microprocessor, the computer has changed nearly every aspect of people's lives for the better.

The earliest ancestor of the modern computer is the abacus, which dates back almost 2,000 years. It is simply a wooden rack holding parallel wires on which beads are strung. When the beads are moved along the wires according to "programming" rules that the user must memorize, all ordinary arithmetic operations can be performed (Soma, 14). The next innovation came in 1642, when Blaise Pascal invented the first "digital calculating machine." It could only add numbers, and they had to be entered by turning dials. It was designed to help Pascal's father, who was a tax collector (Soma, 32).

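Both devices mechanize the same underlying procedure: digit-by-digit decimal addition, with any carry passed to the next column of beads or the next dial. A minimal sketch of that procedure follows; it illustrates the idea only, not either device's actual mechanism.

    from itertools import zip_longest

    def column_add(a_digits, b_digits):
        """Add two numbers given as lists of decimal digits, least significant first."""
        result, carry = [], 0
        for a, b in zip_longest(a_digits, b_digits, fillvalue=0):
            total = a + b + carry
            result.append(total % 10)  # the bead position (or dial setting) for this column
            carry = total // 10        # the carry moves one column to the left
        if carry:
            result.append(carry)
        return result

    # 256 + 178 = 434: column_add([6, 5, 2], [8, 7, 1]) returns [4, 3, 4]
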
In the early 1800s, a mathematics professor named Charles Babbage designed an automatic calculating machine. It was steam-powered and could store up to 1,000 50-digit numbers. Built into his machine were operations that included everything a modern general-purpose computer would need. It was programmed by, and stored data on, cards with holes punched in them, appropriately called "punch cards." His inventions were failures for the most part because of the imprecise machining techniques of the time and the lack of demand for such a device (Soma, 46).

After Babbage, people began to lose interest in computers. Between 1850 and 1900, however, great advances in mathematics and physics began to rekindle that interest (Osborne, 45). Many of these advances involved complex calculations and formulas that were very time-consuming for human computation. The first major use for a computer in the U.S. was during the 1890 census. Two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read the information on cards without human intervention (Gulliver, 82). Since the population of the U.S. was increasing so rapidly, such a machine was an essential tool in tabulating the totals.

These advantages were noted by commercial industries and soon led to the development of improved punched-card business-machine systems by International Business Machines (IBM), Remington Rand, Burroughs, and other corporations. By modern standards the punched-card machines were slow, typically processing 50 to 250 cards per minute, with each card holding up to 80 digits; even at the top rate, that is only about 20,000 digits of data per minute. At the time, however, punched cards were an enormous step forward: they provided a means of input, output, and memory storage on a massive scale. For more than 50 years after their first use, punched-card machines did the bulk of the world's business computing and a good portion of the computing work in science (Chposky, 73).

By the late 1930s punched-card techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook the construction of a large automatic digital computer based on standard IBM electromechanical parts. Aiken's machine, called the Harvard Mark I, handled 23-digit numbers and could perform all four arithmetic operations. It also had special built-in programs to handle logarithms and trigonometric functions. The Mark I was controlled from prepunched paper tape; output was by card punch and electric typewriter. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and could complete long computations without human intervention (Chposky, 103).

The outbreak of World War II produced a desperate need for computing capability, especially for the military. New weapons systems were being produced that needed trajectory tables and other essential data. In 1942, J. Presper Eckert, John W. Mauchly, and their associates at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC, for Electronic Numerical Integrator And Computer. It could multiply two numbers at the rate of 300 products per second by finding the value of each product in a multiplication table stored in its memory. Three hundred multiplications a second, against the Mark I's one every 3 to 5 seconds, made ENIAC roughly 1,000 times faster than the previous generation of computers (Dolotta, 47).

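The table-lookup approach the text describes can be sketched in a few lines: single-digit products are consulted in a precomputed table rather than recomputed, and the partial products are shifted and summed. This is only an illustration of the principle, not ENIAC's actual circuitry.

    # 10 x 10 table of single-digit products, computed once and then only consulted.
    TABLE = {(a, b): a * b for a in range(10) for b in range(10)}

    def table_multiply(x, y):
        """Multiply via table lookups: shift and sum the partial products."""
        total = 0
        for shift, d in enumerate(int(c) for c in reversed(str(y))):
            partial = sum(TABLE[(xd, d)] * 10 ** i
                          for i, xd in enumerate(int(c) for c in reversed(str(x))))
            total += partial * 10 ** shift
        return total

    # table_multiply(23, 45) returns 1035
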
ENIAC used 18,000 standard vacuum tubes, occupied 1,800 square feet of floor space, and consumed about 180,000 watts of electricity. It used punched-card input and output. ENIAC was very difficult to program because it essentially had to be rewired for each new task. It was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally accepted as the first successful high-speed electronic digital computer and was used in many applications from 1946 to 1955 (Dolotta, 50).

The mathematician John von Neumann was very interested in ENIAC. In 1945 he undertook a theoretical study of computation that demonstrated that a computer could have a very simple, fixed structure and yet execute any kind of computation by means of properly programmed control, without any changes in hardware. Von Neumann contributed important ideas about how to build and organize practical, fast computers. These ideas, which came to be referred to as the stored-program technique, became fundamental to future generations of high-speed digital computers and were universally adopted (Hall, 73).

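The essence of the stored-program technique is that instructions and data live in the same memory, so the machine's behavior is changed by loading a new program rather than by rewiring. A toy sketch of the idea (my own illustration, not von Neumann's actual design):

    def run(memory):
        """A minimal fetch-decode-execute loop over a shared instruction/data memory."""
        acc, pc = 0, 0
        while True:
            op, addr = memory[pc]  # fetch the instruction the program counter points at
            pc += 1
            if op == "LOAD":
                acc = memory[addr]
            elif op == "ADD":
                acc += memory[addr]
            elif op == "STORE":
                memory[addr] = acc
            elif op == "HALT":
                return memory

    # The program (cells 0-3) and its data (cells 10-12) share one memory.
    memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", 0),
              10: 2, 11: 3, 12: 0}
    print(run(memory)[12])  # prints 5; a different program in the same cells computes something else
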
The first wave of modern programmed electronic computers to take advantage of these improvements appeared in 1947. This group included computers using random-access memory (RAM), a memory designed to give almost constant access time to any particular piece of information (Hall, 75). These machines had punched-card or punched-tape input and output devices and RAMs of 1,000-word capacity. Physically, they were much more compact than ENIAC: some were about the size of a grand piano and required 2,500 small electron tubes. This was quite an improvement over the earlier machines. The first-generation stored-program computers required considerable maintenance, usually attained 70 to 80 percent reliable operation, and were used for 8 to 12 years. Typically, they were programmed directly in machine language, although by the mid-1950s progress had been made in several aspects of advanced programming. This group of machines included EDVAC and UNIVAC, the first commercially available computers (Hazewindus, 102).

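The practical difference between random-access memory and the sequential media it displaced is the cost of reaching an arbitrary record: a tape must be read past from the start, while RAM reaches any address in one step. A small sketch of the contrast, illustrative only:

    def tape_read(tape, position):
        """Sequential access: every earlier record must be stepped past."""
        steps = 0
        for i, record in enumerate(tape):
            steps += 1
            if i == position:
                return record, steps  # cost grows with the record's position

    def ram_read(memory, address):
        """Random access: one step to any address, hence 'almost constant' access time."""
        return memory[address], 1
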
UNIVAC was developed by John W. Mauchly and J. Presper Eckert in the 1950s. Together they had formed the Eckert-Mauchly Computer Corporation, America's first computer company, in the 1940s. During the development of UNIVAC they began to run short of funds and sold the company to the larger Remington Rand Corporation. Eventually they built a working UNIVAC computer, which was delivered to the U.S. Census Bureau in 1951, where it was used to help tabulate the U.S. population (Hazewindus, 124).

Early in the 1950s two important engineering discoveries changed the electronic computer field. The first computers were made with vacuum tubes, but by the late 1950s computers were being built with transistors, which were smaller, less expensive, more reliable, and more efficient (Shallis, 40). In 1959, Robert Noyce, a physicist at the Fairchild Semiconductor Corporation, invented the integrated circuit, a tiny chip of silicon that contained an entire electronic circuit. Gone was the bulky, unreliable, but fast machine; computers now began to become more compact, more reliable, and of greater capacity (Shallis, 49).

These new technical discoveries rapidly found their way into new models of digital computers. Memory storage capacities increased 800 percent in commercially available machines by the early 1960s, and speeds increased by an equally large margin. These machines were very expensive to purchase or rent and were especially expensive to operate because of the cost of hiring programmers to perform the complex operations the computers ran. Such computers were typically found in large computer centers operated by industry, government, and private laboratories, staffed with many programmers and support personnel (Rogers, 77). By 1956, 76 of IBM's large computer mainframes were in use, compared with only 46 UNIVACs (Chposky, 125).

In the 1960s, efforts to design and develop the fastest possible computers with the greatest capacity reached a turning point with the completion of the LARC machine, built for the Livermore Radiation Laboratories by the Sperry Rand Corporation, and the Stretch computer by IBM. The LARC had a core memory of 98,000 words and multiplied in 10 microseconds. Stretch was provided with several ranks of memory, with slower access for the ranks of greater capacity; the fastest access time was less than 1 microsecond, and the total capacity was in the vicinity of 100 million words (Chposky, 147).

During this time the major computer manufacturers began to offer a range of computer capabilities, as well as various computer-related equipment. These included input devices such as consoles and card feeders; output devices such as page printers, cathode-ray-tube displays, and graphing devices; and optional magnetic-tape and magnetic-disk file storage. These found wide use in business for such applications as accounting, payroll, inventory control, ordering supplies, and billing. Central processing units (CPUs) for such purposes did not need to be very fast arithmetically and were primarily used to access large numbers of records on file. The greatest number of computer systems were delivered for larger applications, such as keeping track of patient records, medications, and treatments in hospitals. They were also used in automated library systems and in database systems such as the Chemical Abstracts system, where the computer records now on file cover nearly all known chemical compounds (Rogers, 98).

The trend during the 1970s was, to some extent, away from extremely powerful, centralized computational centers and toward a broader range of applications for less costly computer systems. Most continuous-process manufacturing operations, such as petroleum refining and electrical-power distribution, began using computers of relatively modest capability to control and regulate their activities. In the 1960s the programming of applications problems had been an obstacle to the self-sufficiency of moderate-sized on-site computer installations, but great advances in applications programming languages removed these obstacles. Applications languages became available for controlling a great range of manufacturing processes, for computer operation of machine tools, and for many other tasks (Osborne, 146). In 1971, Marcian E. Hoff, Jr., an engineer at the Intel Corporation, invented the microprocessor, and another stage in the development of the computer began (Shallis, 121).

A new revolution in computer hardware was now well under way, involving miniaturization of computer-logic circuitry and of component manufacture by what are called large-scale integration (LSI) techniques. In the 1950s it was realized that "scaling down" the size of electronic digital computer circuits and parts would increase speed and efficiency and improve performance, but at that time manufacturing methods were not good enough to accomplish this. About 1960, photoprinting of conductive circuit boards to eliminate wiring became highly developed; it then became possible to build resistors and capacitors into the circuitry by photographic means (Rogers, 142). In the 1970s entire assemblies, such as adders, shift registers, and counters, became available on tiny chips of silicon. In the 1980s very-large-scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, became increasingly common. Many companies, some new to the computer field, introduced programmable minicomputers supplied with software packages in the 1970s. The size-reduction trend continued with the introduction of personal computers, programmable machines small enough and inexpensive enough to be purchased and used by individuals (Rogers, 153).

One of the first such machines was introduced in January 1975, when Popular Electronics magazine provided plans that would allow any electronics wizard to build his own small, programmable computer for about $380 (Rose, 32). The computer was called the Altair 8800. Programming it involved pushing buttons and flipping switches on the front of the box; it had no monitor or keyboard, and its applications were very limited (Jacobs, 53). Even so, many orders came in for it, and several famous founders of computer and software manufacturing companies got their start in computing through the Altair. For example, Steve Jobs and Steve Wozniak, founders of Apple Computer, built a much cheaper yet more capable machine than the Altair and turned their hobby into a business (Fluegelman, 16).

After the introduction of the Altair 8800, the personal-computer industry became a fierce battleground of competition. IBM, whose business machines had set the industry standard for over half a century, held that position when it introduced its first personal computer, the IBM 5100, in 1975 (Chposky, 156). Meanwhile, the newly formed Apple Computer company was releasing its own personal computer, the Apple II. (The Apple I, the first computer designed by Jobs and Wozniak in Wozniak's garage, was not produced on a wide scale.) Software was needed to run the computers as well. Microsoft developed a disk operating system (MS-DOS) for the IBM computer, while Apple developed its own software system (Rose, 37). Because Microsoft had now set the software standard for IBMs, every software manufacturer had to make its software compatible with Microsoft's. This would lead to huge profits for Microsoft (Cringely, 163).

The main goal of computer manufacturers was to make the computer as affordable as possible while increasing its speed, reliability, and capacity. Nearly every manufacturer accomplished this, and computers popped up everywhere. Computers were in businesses keeping track of inventories. Computers were in colleges aiding students in research. Computers were in laboratories making complex calculations at high speed for scientists and physicists. The computer had made its mark everywhere in society and had built up a huge industry (Cringely, 174).

The future is promising for the computer industry and its technology. The speed of processors is expected to double every year and a half in the coming years. As manufacturing techniques are further perfected, the prices of computer systems are expected to fall steadily. However, since microprocessor technology will keep advancing, the higher cost of new processors will offset the drop in price of older ones. In other words, the price of a new computer will stay about the same from year to year while its technology steadily improves (Zachary, 42).

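Taken at face value, that doubling figure compounds quickly: relative speed after n years is 2^(n/1.5). A quick check of what the projection implies, assuming the cited rate holds:

    # Projected relative processor speed, assuming doubling every 1.5 years.
    for years in (3, 6, 9):
        print(f"after {years} years: {2 ** (years / 1.5):.0f}x faster")
    # after 3 years: 4x faster; after 6 years: 16x faster; after 9 years: 64x faster
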
Since the end of World War II, the computer industry has grown from a standing start into one of the biggest and most profitable industries in the United States. It now comprises thousands of companies making everything from multi-million-dollar high-speed supercomputers to printout paper and floppy disks. It employs millions of people and generates tens of billions of dollars in sales each year (Malone, 192). The computer has touched every aspect of people's lives. It has affected the way people work and play, and it has made everyone's life easier by doing difficult work for them. The computer truly is one of the most incredible inventions in history.

Works Cited

Chposky, James. Blue Magic. New York: Facts on File Publishing, 1988.
Cringely, Robert X. Accidental Empires. Reading, MA: Addison-Wesley Publishing, 1992.
Dolotta, T.A. Data Processing: 1940-1985. New York: John Wiley & Sons, 1985.
Fluegelman, Andrew. "A New World." MacWorld. San Jose, CA: MacWorld Publishing, February 1984 (premiere issue).
Gulliver, David. Silicon Valley and Beyond. Berkeley, CA: Berkeley Area Government Press, 1981.
Hall, Peter. Silicon Landscapes. Boston: Allen & Unwin, 1985.
Hazewindus, Nico. The U.S. Microelectronics Industry. New York: Pergamon Press, 1988.
Jacobs, Christopher W. "The Altair 8800." Popular Electronics. New York: Popular Electronics Publishing, January 1975.
Malone, Michael S. The Big Score: The U.S. Computer Industry. Garden City, NY: Doubleday & Co., 1985.
Osborne, Adam. Hypergrowth. Berkeley, CA: Idthekkethan Publishing Company, 1984.
Rogers, Everett M. Silicon Valley Fever. New York: Basic Books, Inc., 1984.
Rose, Frank. West of Eden. New York: Viking Publishing, 1989.
Shallis, Michael. The Silicon Idol. New York: Schocken Books, 1984.
Soma, John T. The History of the Computer. Toronto: Lexington Books, 1976.
Zachary, William. "The Future of Computing." Byte. Boston: Byte Publishing, August 1994.